Stochastic Compositional Gradient Descent Under Compositional Constraints

Authors

Abstract

This work studies constrained stochastic optimization problems where the objective and constraint functions are convex and expressed as compositions of stochastic functions. The problem arises in the context of fair classification, fair regression, and the design of queuing systems. Of particular interest is the large-scale setting where an oracle provides the stochastic gradients of the constituent functions, and the goal is to solve the problem with a minimal number of calls to the oracle. Owing to the compositional form, the stochastic gradients provided by the oracle do not yield unbiased estimates of the objective or constraint gradients. Instead, we construct approximate gradients by tracking the inner function evaluations, resulting in a quasi-gradient saddle point algorithm. We prove that the proposed algorithm is guaranteed to find the optimal and feasible solution almost surely. We further establish that the proposed algorithm requires $\mathcal{O}(1/\epsilon^{4})$ data samples in order to obtain an $\epsilon$-approximate optimal point while also ensuring zero constraint violation. This result matches the sample complexity of the stochastic compositional gradient descent method for unconstrained problems and improves upon the best-known results for constrained settings. The efficacy of the proposed algorithm is tested on both fair classification and fair regression problems. The numerical results show that the proposed algorithm outperforms the state-of-the-art algorithms in terms of the convergence rate.
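To make the tracking idea concrete, here is a minimal sketch of a quasi-gradient primal-dual loop on a toy compositional problem. This is an illustration of the general technique rather than the paper's algorithm: the toy constituent functions, the constraint, and the step-size schedules are all assumptions made for the example.

```python
import numpy as np

# Minimal sketch of a quasi-gradient primal-dual update for
#   min_x f(E[g(x; xi)])  s.t.  h(E[g(x; xi)]) <= 0,
# illustrating the inner-function tracking idea. The toy problem,
# step sizes, and function choices below are illustrative assumptions,
# not the paper's exact algorithm or tuning.

rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))

def g_sample(x):      # noisy evaluation of the inner function, E[g(x)] = A x
    return A @ x + 0.1 * rng.standard_normal(d)

def g_jac_sample(x):  # noisy Jacobian of the inner function
    return A + 0.1 * rng.standard_normal((d, d))

def f_grad(y):        # gradient of the outer objective f(y) = ||y||^2 / 2
    return y

def h(y):             # scalar constraint h(y) = y_0 - 1 <= 0
    return y[0] - 1.0

def h_grad(y):
    e = np.zeros(d)
    e[0] = 1.0
    return e

x = rng.standard_normal(d)
y = g_sample(x)       # running estimate tracking the inner expectation E[g(x)]
lam = 0.0             # dual variable for the constraint

for t in range(1, 5001):
    alpha, beta = 0.5 / np.sqrt(t), 1.0 / t**0.75  # illustrative decaying steps
    # Track the inner expectation incrementally instead of re-estimating it.
    y = (1 - beta) * y + beta * g_sample(x)
    # Quasi-gradient of the Lagrangian via the chain rule at the tracked y.
    J = g_jac_sample(x)
    x = x - alpha * (J.T @ (f_grad(y) + lam * h_grad(y)))  # primal descent
    lam = max(0.0, lam + alpha * h(y))                     # projected dual ascent

print("constraint value h at E[g(x)]:", h(A @ x))
```

The point of the sketch is that the running average y is refreshed with a single new inner-function sample per iteration, sidestepping the fact that the oracle provides no unbiased estimate of the full compositional gradient.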


Similar Resources

Stochastic Compositional Gradient Descent: Algorithms for Minimizing Nonlinear Functions of Expected Values

Classical stochastic gradient methods are well suited for minimizing expected-value objective functions. However, they do not apply to the minimization of a nonlinear function involving expected values, i.e., problems of the form $\min_x f(\mathbb{E}_w[g_w(x)])$. In this paper, we propose a class of stochastic compositional gradient descent (SCGD) algorithms that can be viewed as stochastic versions of q...
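As an illustration of the two-timescale idea behind SCGD, the following sketch runs a fast running average that tracks the inner expectation alongside a slower quasi-gradient step; the quadratic toy problem and step-size exponents are assumptions for the example, not the schedules analyzed in the paper.

```python
import numpy as np

# Two-timescale SCGD sketch for min_x f(E_w[g_w(x)]) on an assumed
# quadratic toy problem: E_w[g_w(x)] = B x and f(y) = ||y||^2 / 2,
# whose minimizer is x = 0. All problem data and steps are illustrative.

rng = np.random.default_rng(1)
d = 3
B = rng.standard_normal((d, d))

def g_sample(x):       # noisy sample of the inner function
    return B @ x + 0.1 * rng.standard_normal(d)

def f_grad(y):         # gradient of the outer function
    return y

x = rng.standard_normal(d)
y = g_sample(x)        # fast iterate tracking E_w[g_w(x)]

for t in range(1, 10001):
    alpha, beta = 0.1 / np.sqrt(t), 1.0 / t**0.75   # slow / fast time scales
    y = (1 - beta) * y + beta * g_sample(x)          # fast: track the inner mean
    J = B + 0.1 * rng.standard_normal((d, d))        # sampled inner Jacobian
    x = x - alpha * (J.T @ f_grad(y))                # slow: quasi-gradient step

print("||x|| after SCGD:", np.linalg.norm(x))
```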


Lucas-Kanade Inverse Compositional Using Multiple Brightness and Gradient Constraints

Abstract: A recently proposed fast image alignment algorithm is the inverse compositional algorithm based on Lucas-Kanade. In this paper, we present an overview of different brightness and gradient constraints used with the inverse compositional algorithm. We also propose an efficient and robust data constraint for the estimation of global motion from image sequences. The constraint combines bri...


Cognitive Constraints on Compositional Systems

In this article I explore the relationship between composing and listening. I begin with a problematic story, draw some general conclusions, introduce relevant concepts from Lerdahl and Jackendoff (1983) and related work, propose some cognitive constraints on compositional systems, discuss "pitch space", and explain why serial (or 12-tone) organizations are cognitively opaque. Most of these top...


Variational Stochastic Gradient Descent

In the Bayesian approach to probabilistic modeling of data, we select a model for the probabilities of the data that depends on a continuous vector of parameters. For a given data set, Bayes' theorem gives a probability distribution over the model parameters. The inference of outcomes and probabilities of new data can then be found by averaging over the parameter distribution of the model, which is an intr...


Byzantine Stochastic Gradient Descent

This paper studies the problem of distributed stochastic optimization in an adversarial setting where, out of the m machines which allegedly compute stochastic gradients every iteration, an α-fraction are Byzantine, and can behave arbitrarily and adversarially. Our main result is a variant of stochastic gradient descent (SGD) which finds ε-approximate minimizers of convex functions in T = Õ ( 1...
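As a toy illustration of this setting, the loop below runs SGD with m workers, of which an α-fraction report arbitrary gradients, and aggregates with a coordinate-wise median. The median rule is a simple stand-in for robust aggregation, not the specific variant analyzed in the paper, and the problem instance is assumed for the example.

```python
import numpy as np

# Toy Byzantine-robust SGD: honest workers report noisy gradients of
# f(x) = ||x - x_star||^2 / 2, Byzantine workers report garbage, and the
# server aggregates with a coordinate-wise median (a simple stand-in
# aggregation rule, assumed for this illustration).

rng = np.random.default_rng(2)
d, m, alpha = 4, 10, 0.2
num_byz = int(alpha * m)
x_star = np.ones(d)

def worker_grad(x, byzantine):
    if byzantine:
        return 100.0 * rng.standard_normal(d)           # arbitrary bad report
    return (x - x_star) + 0.5 * rng.standard_normal(d)  # honest noisy gradient

x = np.zeros(d)
for t in range(1, 2001):
    grads = np.stack([worker_grad(x, i < num_byz) for i in range(m)])
    agg = np.median(grads, axis=0)       # robust coordinate-wise aggregation
    x = x - (1.0 / np.sqrt(t)) * agg

print("distance to optimum:", np.linalg.norm(x - x_star))
```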



Journal

Journal title: IEEE Transactions on Signal Processing

Year: 2023

ISSN: 1053-587X, 1941-0476

DOI: https://doi.org/10.1109/tsp.2023.3244326